Preserving Neural Function under Extreme Scaling
Authors
Abstract
Important brain functions must be conserved across organisms of vastly different sizes. Here we study the scaling properties of an essential component of computation in the brain: the single neuron. We compare the morphology and signal propagation of a uniquely identifiable interneuron, the HS cell, in the blowfly (Calliphora) with its exact counterpart in the fruit fly (Drosophila), which is about four times smaller in each dimension. Anatomical features of the HS cell scale isometrically and minimise wiring costs but, by themselves, do not scale so as to preserve the cell's electrotonic behaviour. However, the membrane properties are set so as to conserve dendritic and axonal delays and attenuation, as well as dendritic integration of visual information. In conclusion, the electrotonic structure of a neuron, the HS cell in this case, is surprisingly stable over a wide range of morphological scales.
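To make the scaling argument concrete: in steady-state cable theory the length constant is lambda = sqrt(R_m * d / (4 * R_i)), so if every anatomical length scales by a factor s, electrotonic distances L/lambda change by sqrt(s) rather than staying constant; rescaling the specific membrane resistance R_m by the same factor s restores them. The sketch below illustrates this arithmetic with illustrative parameter values (the R_m, R_i and geometry figures are assumptions, not the paper's fitted values).

```python
import math

def length_constant(R_m, R_i, d):
    """Steady-state cable length constant: lambda = sqrt(R_m * d / (4 * R_i)).

    R_m: specific membrane resistance (ohm * cm^2)
    R_i: axial resistivity (ohm * cm)
    d:   dendrite diameter (cm)
    """
    return math.sqrt(R_m * d / (4.0 * R_i))

# Illustrative blowfly-like values (assumptions for this sketch only).
R_m, R_i = 8000.0, 200.0   # ohm*cm^2, ohm*cm
L, d = 0.1, 4e-4           # 1000 um dendrite length, 4 um diameter, in cm

s = 0.25                   # Drosophila: roughly 4x smaller in each dimension

X_big = L / length_constant(R_m, R_i, d)
# Isometric shrinkage alone: electrotonic length changes by sqrt(s).
X_small = (s * L) / length_constant(R_m, R_i, s * d)
# Scaling R_m by the same factor s restores the original electrotonic length.
X_comp = (s * L) / length_constant(s * R_m, R_i, s * d)

print(f"L/lambda, blowfly:               {X_big:.3f}")
print(f"L/lambda, shrunk only:           {X_small:.3f}  (= sqrt(s) * original)")
print(f"L/lambda, shrunk + R_m rescaled: {X_comp:.3f}")
```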
Similar Resources
Outlier Detection Using Extreme Learning Machines Based on Quantum Fuzzy C-Means
One of the most important concerns of a data miner is always to have accurate and error-free data: data that contains no human errors and whose records are complete and correct. In this paper, a new learning model based on an extreme learning machine neural network is proposed for outlier detection. The functioning of neural networks depends on various parameters such as the structur...
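For readers unfamiliar with the base learner, the following is a minimal sketch of a standard extreme learning machine regressor: random, fixed hidden-layer weights with output weights solved in closed form by least squares. It uses plain NumPy under illustrative settings and does not reproduce the paper's quantum fuzzy c-means stage or its outlier-detection logic.

```python
import numpy as np

def elm_fit(X, y, n_hidden=64, rng=np.random.default_rng(0)):
    """Standard extreme learning machine: random fixed hidden layer,
    output weights obtained in closed form by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # fixed random input weights
    b = rng.normal(size=n_hidden)                 # fixed random biases
    H = np.tanh(X @ W + b)                        # hidden activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # solve for output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: fit a noisy sine curve.
X = np.linspace(-3, 3, 200)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(1).normal(size=200)
W, b, beta = elm_fit(X, y)
print("train MSE:", np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```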
On multidimensional scaling and the embedding of self-organising maps
The self-organising map (SOM) and its variant, visualisation induced SOM (ViSOM), have been known to yield results similar to multidimensional scaling (MDS). However, the exact connection has not been established. In this paper, a review of the SOM, its cost function, and its topological measures is provided first. We then examine the exact scaling effect of the SOM and ViSOM from their objective...
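As a point of reference for the comparison, here is a minimal one-dimensional SOM in plain NumPy: units arranged on a line compete for each input, and the winner and its neighbours move toward the sample. Rates, schedules and sizes are illustrative assumptions; the ViSOM variant and the MDS connection analysed in the paper are not reproduced.

```python
import numpy as np

def train_som_1d(data, n_units=10, epochs=50, lr=0.5, sigma=2.0, seed=0):
    """Minimal 1-D self-organising map trained on rows of `data`."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(data.min(), data.max(), size=(n_units, data.shape[1]))
    grid = np.arange(n_units)
    for t in range(epochs):
        a = lr * (1 - t / epochs)                          # decaying rate
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(W - x, axis=1)) # best-matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))  # neighbourhood
            W += a * h[:, None] * (x - W)                  # pull units toward x
        sigma = max(0.5, sigma * 0.95)                     # shrink neighbourhood
    return W

# Toy usage: map 2-D ring data onto a 1-D chain of units.
theta = np.random.default_rng(1).uniform(0, 2 * np.pi, 300)
ring = np.c_[np.cos(theta), np.sin(theta)]
print(train_som_1d(ring).round(2))
```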
DizzyRNN: Reparameterizing Recurrent Neural Networks for Norm-Preserving Backpropagation
The vanishing and exploding gradient problems are well-studied obstacles that make it difficult for recurrent neural networks to learn long-term time dependencies. We propose a reparameterization of standard recurrent neural networks to update linear transformations in a provably norm-preserving way through Givens rotations. Additionally, we use the absolute value function as an element-wise no...
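The norm-preserving ingredient can be seen in isolation: a Givens rotation is orthogonal, so any product of Givens rotations is orthogonal too, and multiplying a hidden state by it leaves the state's Euclidean norm unchanged. The NumPy sketch below demonstrates that property; it illustrates the principle only, not the paper's parameterization or training procedure.

```python
import numpy as np

def givens(n, i, j, theta):
    """n x n Givens rotation acting in the (i, j) plane: orthogonal,
    hence it preserves the Euclidean norm of any vector it multiplies."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i], G[j, j] = c, c
    G[i, j], G[j, i] = -s, s
    return G

rng = np.random.default_rng(0)
n = 6
# Compose random Givens rotations into one orthogonal recurrent matrix.
W = np.eye(n)
for i in range(n):
    for j in range(i + 1, n):
        W = W @ givens(n, i, j, rng.uniform(0, 2 * np.pi))

h = rng.normal(size=n)
print(np.linalg.norm(h), np.linalg.norm(W @ h))  # identical norms
```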
Methods for Binary Multidimensional Scaling
Multidimensional scaling (MDS) is the process of transforming a set of points in a high-dimensional space to a lower-dimensional one while preserving the relative distances between pairs of points. Although effective methods have been developed for solving a variety of MDS problems, they mainly depend on the vectors in the lower-dimensional space having real-valued components. For some applicat...
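As context, classical (Torgerson) MDS is the standard real-valued solution this abstract alludes to: double-centre the squared distance matrix and embed the points using the top eigenpairs. A minimal NumPy sketch over assumed toy data follows; binary-valued variants of the kind the paper develops are not shown.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed points in k dimensions from a
    matrix of pairwise Euclidean distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    B = -0.5 * J @ (D ** 2) @ J            # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]       # top-k eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Toy usage: pairwise distances among 3-D points, embedded in 2-D.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D, k=2)
print(Y.shape)  # (10, 2): real-valued low-dimensional coordinates
```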
Scale Normalization
One of the difficulties of training deep neural networks is caused by improper scaling between layers. Scaling issues introduce exploding / vanishing gradient problems, and have typically been addressed by careful scale-preserving initialization. We investigate the value of preserving scale, or isometry, beyond the initial weights. We propose two methods of maintaining isometry, one exact and one stochasti...
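The "careful scale-preserving initialization" baseline mentioned here can be illustrated directly: an orthogonal weight matrix is an exact isometry, so the layer neither grows nor shrinks activations. The sketch below uses assumed sizes; the paper's two methods for maintaining isometry during training are not reproduced.

```python
import numpy as np

def orthogonal_init(n, seed=0):
    """Scale-preserving (isometric) weight initialization: the QR
    decomposition of a random Gaussian matrix yields an orthogonal W,
    so ||W x|| = ||x|| for every input x."""
    rng = np.random.default_rng(seed)
    Q, R = np.linalg.qr(rng.normal(size=(n, n)))
    return Q * np.sign(np.diag(R))   # fix column-sign ambiguity

W = orthogonal_init(128)
x = np.random.default_rng(1).normal(size=128)
print(np.linalg.norm(x), np.linalg.norm(W @ x))  # equal: scale preserved
```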